Title

Pizza Deliveries

Author

David Wright
American River College, Geography 350: Data Acquisition in GIS; Fall 2008
Contact Information: wrightdm@imail.losrios.edu

Abstract

     My time as a delivery driver has taught me that accurate maps are essential to a successful delivery service. By drawing on a variety of information sources, including GPS, shapefiles, and imagery, I was able to acquire the data needed to form the basis for a new set of maps for the store where I work.

Introduction

     I have been working as a delivery driver for a Round Table Pizza franchise for a number of years now. There have been many occasions over the years where customer service has suffered because of inadequacies in our current maps. We currently use a Thomas Guide map book. The Thomas Guide, however, has certain defects I wish to address in this project, notably errors in the street data, inadequate block numbering, and certain cartographic deficiencies. My goal for this project is to acquire the data needed to produce a set of maps and tables that will help drivers complete their deliveries efficiently. In addition to traditional-style maps similar to the Thomas Guide, I wish to produce a set of network analysis maps that illustrate the cost of adding a stop to a route, much like a service area for a route. Hopefully, this visual argument will help settle certain disputes that arise about best routing practices.

Background

     Several years ago, when I first started taking GIS classes at ARC, I found I had an opportunity to solve a problem I had had for some time, namely other drivers constantly asking me, "Which apartment complex is this?" or "Which end of the street is this house on?" (some streets have breaks). Another problem, although somewhat rare, is the occasional overly excitable driver who is too easily frustrated when they can't find an address. To be fair, part of the problem lies with the Thomas Guide maps we use. They have errors: they show streets that do not exist, fail to show breaks in streets or show breaks that do not exist, or fail to show streets that do exist. In fairness to Thomas Guide, the street data I have downloaded from Sacramento County has many of the same problems. And finally, although Thomas Guide shows parks, golf courses, schools, cemeteries, and some shopping centers, it does not include apartments, mobile home parks, or quadplexes.

     I acquired and edited the data I would need from SACOG (SACOG 2004), DOQQs, and some GPS sessions (mostly for certain problem houses). I built my own database and, finally, created a map book for the store that was reproduced and distributed to the other drivers by the management. Two years ago I transferred to another store in northwest Roseville. Because the Thomas Guide coverage for this area has the same kinds of problems listed above, if not as many of them, I am working on a new map book. To do this, I need street and parcel data for the cities of Roseville and Rocklin and for Placer County, along with property maps for various apartment complexes in our delivery area.

     I have already acquired most of the property maps, although they need to be edited for content and clarity. The city of Roseville makes street and parcel data available for download. The street data seems to be complete. The parcel data, however, does not include address or zoning information. Fortunately, the city does provide a zoning book in PDF format that contains the requisite information. The city of Rocklin and Placer County are another problem. They do not have any information available for download that I can directly import into a GIS. Placer does have an Internet mapping site from which I can capture images of the data, tile them together, and convert them to vector-based feature classes. They do not, however, have a handy zoning book to fill in the needed attribute data; that will have to be done manually. I am also looking to build a table of property names, business names, and suite numbers to link with parcel addresses. Property names are useful when delivering to a property the customer knows by name, but not by address.

Methods

     For this project certain conditions were maintained for all data imported into my geodatabase. First, all of the city of Roseville (Roseville 2008) data, the Sacramento County (Sacramento 2008) data, and the SACOG (SACOG 2004) data covering Placer County have defined projections of California State Plane Zone II, NAD83. Second, most, if not all, of the feature classes contain data that I do not want or need cluttering up my database, yet it is possible that I might need this data in the future. To this end, all imported features carry links describing which table they came from and which record was imported. The first part of this linkage is a lookup table detailing the name, storage location, acquisition date, etc., of each source feature class. The second part of the link stores the value of the unique ID of the original feature. Finally, the software used for this project includes ESRI (ESRI 2008) GIS products, Visual Studio (Microsoft 2005), and Excel (Microsoft 2007). The following graphic shows my primary area of interest for this project; a sketch of the import linkage follows it.


Areas Of Interest
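
     To make the linkage concrete, here is a minimal sketch of the scheme in Python. The table layout, field names, and values are hypothetical illustrations, not the actual names used in my geodatabase.

    # Lookup table: one row per imported source feature class.
    # All names and values here are hypothetical.
    sources = {
        1: {"name": "roseville_streets",
            "location": r"C:\data\roseville\streets.shp",
            "acquired": "2008-01-15"},
        2: {"name": "sacog_streets",
            "location": r"C:\data\sacog\streets.shp",
            "acquired": "2004-06-01"},
    }

    # Each imported feature carries two link fields: the key of its
    # source in the lookup table, and the unique ID of the original record.
    feature = {"street_name": "Main St", "src_table": 1, "src_id": 4711}

    def describe_origin(feature):
        """Report which table and record an imported feature came from."""
        src = sources[feature["src_table"]]
        return "%s (acquired %s), record %d" % (
            src["name"], src["acquired"], feature["src_id"])

    print(describe_origin(feature))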

     Starting with the Roseville, Sacramento, and SACOG street data, I imported the shapefiles into my geodatabase. My first task was to integrate the street class codes into a single set of codes: SACOG uses CFCC (Census Feature Class Codes), while Sacramento and Roseville use their own schemes. I then went about seaming the feature classes together; there was a certain amount of overlap, and there were incongruities at the borders. In addition, Roseville is part of Placer County, which means that all of the Roseville data is overlapped by the Placer data. The Roseville data, however, is newer and far more complete, so, except for a few miscellaneous features, all Placer data overlapping Roseville was excluded from the import.
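
     The integration amounts to a translation table per source into one unified scheme. The Python sketch below uses made-up code values; the actual CFCC and local codes differ.

    # Unified street class scheme (code values are hypothetical examples).
    CFCC_TO_UNIFIED = {"A15": 1, "A31": 2, "A41": 3}       # SACOG (CFCC)
    ROSEVILLE_TO_UNIFIED = {"FWY": 1, "MAJ": 2, "RES": 3}  # local codes

    def unify(source, code):
        table = {"sacog": CFCC_TO_UNIFIED,
                 "roseville": ROSEVILLE_TO_UNIFIED}[source]
        # None flags an unmapped code that needs manual review.
        return table.get(code)

    print(unify("sacog", "A41"))  # -> 3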

     I then proceeded to import the parcel data from Sacramento and Roseville (SACOG had none). As with the street data, I first needed to convert their land use codes for my own use. Sacramento's codes were far too detailed for what I needed, and Roseville had none in its parcel data. Roseville did have some secondary, special-purpose shapefiles for parks, schools, and some commercial properties, from which I was able to derive a limited amount of attribute information and join it to the main parcels feature class. To fill out the remaining address and land use fields, I had to manually copy the information from a Zoning Book PDF that I downloaded from the city's site. To aid me in this task, I wrote a UI tool and form for entering the address and land use information. The tool works by sorting the spatially selected records into logical address order: starting with the first record, the remaining records are searched to find the one touching it; that record becomes the new starting record, and the search repeats. When all records have been ordered, they are processed one at a time and updated with a new address that is incremented by a defined amount for each record. A sketch of this ordering logic follows the figure below. After each fill, I would verify that the numbers were correct, and periodically turn on a layer symbolized by unique street names, which would visually indicate data dropouts or misinformation. This caught problems a few times.


Parcel Address Tool
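
     In outline, the tool's ordering and fill logic looks like the Python sketch below. The parcel objects and the touches() test are stand-ins for the spatially selected records and polygon adjacency the real tool operates on.

    def order_by_touching(parcels, touches):
        """Chain parcels into logical address order: starting with the
        first record, repeatedly pull out the remaining record that
        touches the current one."""
        remaining = list(parcels)
        ordered = [remaining.pop(0)]
        while remaining:
            for i, p in enumerate(remaining):
                if touches(ordered[-1], p):
                    ordered.append(remaining.pop(i))
                    break
            else:
                # No touching neighbor found (a break in the block);
                # start a new chain with the next record.
                ordered.append(remaining.pop(0))
        return ordered

    def assign_addresses(ordered, start, step):
        """Update each ordered parcel with an address incremented by a
        defined amount, e.g. step=2 for one side of a street."""
        for n, parcel in enumerate(ordered):
            parcel["address"] = start + n * step

    # Example: five parcels along one side of a street, 100 ft apart.
    parcels = [{"x": x} for x in (0, 300, 100, 200, 400)]
    touches = lambda a, b: abs(a["x"] - b["x"]) <= 100  # stand-in adjacency
    ordered = order_by_touching(parcels, touches)
    assign_addresses(ordered, start=1001, step=2)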

     Acquiring the needed data from Placer County (PlacerGIS 2008) was a bit more involved. First, their data was available only as imagery. The images of the parcels and streets were also in color, making them unsuitable for ArcScan, which requires black-and-white images. Finally, the geographic extent needed was greater than the largest single image I was able to download at the desired spatial resolution of no more than 2 feet per pixel, so I had to tile images together. I observed that their web page sent requests to the server by passing parameters in the URL. By manually manipulating these parameters I could precisely define the image size and geographic extent of each tile requested. For each area of interest I ended up with three layers consisting of one or more tiles each. Because the parameters had been precisely defined, I could write a small program to automatically merge the tiles into a single bitmap for each layer. Using information on world files in ArcMap's help, I then used the defined geographic extents to manually create a world file for each layer. Since each layer for a given area of interest had identical extents, I could simply make copies of the world file for each.

Rocklin's world file (six lines, shown here with their meanings):

    2.0          pixel size in the x-direction
    0.0          rotation term
    0.0          rotation term
    -2.0         pixel size in the y-direction (negative)
    6754150.5    x-coordinate of the center of the upper-left pixel
    2070299.5    y-coordinate of the center of the upper-left pixel

World File
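
     My tiling program followed the pattern sketched below, here in Python using the Python Imaging Library; the file names, tile grid, and extent values are hypothetical examples.

    from PIL import Image

    TILE_W, TILE_H = 1000, 1000      # pixel size of each requested tile
    COLS, ROWS = 3, 2                # tile grid for this area of interest
    PIXEL_SIZE = 2.0                 # feet per pixel, as requested via the URL
    ULX, ULY = 6754150.5, 2070299.5  # center of the upper-left pixel

    # Paste the downloaded tiles into one bitmap for the layer.
    merged = Image.new("RGB", (COLS * TILE_W, ROWS * TILE_H))
    for row in range(ROWS):
        for col in range(COLS):
            tile = Image.open("tile_%d_%d.png" % (row, col))
            merged.paste(tile, (col * TILE_W, row * TILE_H))
    merged.save("rocklin_parcels.png")

    # World file: x pixel size, two rotation terms, negative y pixel
    # size, then the coordinates of the center of the upper-left pixel.
    f = open("rocklin_parcels.pgw", "w")
    for value in (PIXEL_SIZE, 0.0, 0.0, -PIXEL_SIZE, ULX, ULY):
        f.write("%s\n" % value)
    f.close()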

     Once I had the images referenced, I used the raster cleanup tools in ArcScan to prepare them for vectorization; there were numerous dangles around the borders of the merged images. After cleanup, I used ArcScan to create line features from the images. For the parcel images, I then used the Feature to Polygon tool to create parcel polygons (an ArcInfo license was needed for this). At this point I added appropriate fields to match the other street and parcel feature classes. The Placer GIS web site, in addition to providing images of parcels, has a tool to query individual parcels for information such as their addresses. Using the tools discussed above, I filled in the address and land use fields of my acquired parcel data.

     Although the street data from Placer appears to be relatively current and complete, there is at least one street, and there are some parking lots, that were too new to show up in the imagery. To acquire this data, I used a Garmin 76S (Garmin 2008) GPS unit set to the appropriate projection and drove along the streets. I then downloaded the data using DNR Garmin (DNR Garmin 2008) software, version 5.4.1, and imported it into my database. I also acquired readings for some speed bumps. Although not really needed for my basic mapping project, they will prove useful later for network analysis projects determining least-time (not shortest-distance) routings.

Results

     Although I have not yet finished processing the entire dataset, my minimum goals have been met. The following pictures illustrate the results of my work.

Parcel addresses symbolized by unique name

Uncleaned streets

Cleaned streets


GPS Street Points


Edited Street Points

Analysis

     Although my efforts have been generally successful, certain problems did crop up. First, the parcel data I downloaded from Sacramento and Roseville had duplicate APNs in their respective datasets. This means that the next time I download updates I will not be able to directly compare different versions of some features to check for changes; those features will need special processing. Roseville seems to be the worst offender in this regard: they actually have duplicate parcels stacked on top of each other. I did a field check of one of these developments to see what was going on, but the buildings were still under construction. I suspect they may be multi-story condos with each floor under the control of a different owner. We'll see.
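
     Flagging these duplicates is straightforward once the attribute table is exported; a minimal Python sketch, with hypothetical field names and APN values, might look like this.

    # Count occurrences of each APN; any APN appearing more than once
    # will need special processing before version comparisons.
    parcels = [
        {"apn": "017-0010-001", "objectid": 1},
        {"apn": "017-0010-002", "objectid": 2},
        {"apn": "017-0010-002", "objectid": 3},  # stacked duplicate
    ]

    counts = {}
    for p in parcels:
        counts[p["apn"]] = counts.get(p["apn"], 0) + 1

    duplicates = [apn for apn, n in counts.items() if n > 1]
    print(duplicates)  # ['017-0010-002']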

     The next problem came when I discovered that the Zoning Map Book Roseville provides lists multiple addresses for certain parcels (corner lots). I have checked some of them to see which address is correct. It turns out some of these houses are duplexes, although they do not look like the typical duplexes I have seen elsewhere. The others are a mystery; some of the houses I have checked have only one address so far as I can determine. I will need to check the rest to verify the correct addresses.

     When I first downloaded the parcel and street images from Placer, I discovered that they were multicolored and therefore not suitable for ArcScan. I modified my image tiling program to clamp colors closer to black than a specified threshold to black, and colors closer to white than the threshold to white. A sketch of this step follows the figures below.

Parcel Color Samples
Parcel Color Samples

Parcel Color Values
Parcel Color Values

Colors Clamped to Black or White
Colors Clamped to Black or White
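
     The clamping step itself is simple; here is a minimal Python sketch using the Python Imaging Library, approximating the color distance test with a grayscale threshold. The threshold value is a hypothetical example.

    from PIL import Image

    THRESHOLD = 128  # gray levels below this clamp to black, above to white

    img = Image.open("rocklin_parcels.png").convert("L")  # to grayscale
    bw = img.point(lambda level: 0 if level < THRESHOLD else 255)
    bw.save("rocklin_parcels_bw.png")  # black-and-white input for ArcScan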

     The streets imagery had annotation embedded in it. I decided in the end that cleaning up the images would probably take as much work as digitizing the streets by hand, so I digitized them by hand.


Streets Annotation

     The parcels in the Baseline area had some alignment problems. The older neighborhoods looked fine, but the newer areas are seriously misaligned. I have decided to postpone processing them for the time being; the immediate need was marginal and I have other work with higher priorities.


Misaligned parcels

     ArcScan did an acceptable job of creating line features from the images, but lines that should have been straight ended up stair-stepped. Although this is undesirable from a cartographic standpoint, in terms of accuracy the features that should be there are present and readily identifiable. In addition, my original target of a 2-foot pixel resolution turned out to be too coarse for one area: the features were too closely spaced for ArcScan to process correctly, and many internal dangles were created. I believe a 1-foot resolution would be sufficient to process this area. I originally decided against a 1-foot resolution because it would have quadrupled my workload. That decision appears to have been a mistake, at least for this one small area.

     Finally, while processing the data acquired from the GPS unit, I discovered that I had muffed setting up the projection for my data, and the GPS unit was no longer available. I was therefore unable to use the data as linear tracks. I was, however, able to import the point data of the tracks into ArcMap as XY Data through the Tools menu. This was readily done by using the LAT and LONG fields as my coordinates instead of the X_PROJ and Y_PROJ fields of unknown projection, and then applying the GCS_WGS_1984 coordinate system as the spatial reference. The data then lined up nicely with my existing data, and I was able to use it as a guide for digitizing new street features. I had set the GPS unit to record a point once a second. This proved useful on curves but produced far too much data on straightaways, so I would have had to edit the data anyway. This is especially true for the parking lots, where in several places I went over some sections more than once.
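
     Thinning the straightaway points could be automated: drop any point that lies nearly on the line between its neighbors, keeping dense vertices only on curves. A minimal Python sketch of this idea follows, with a hypothetical tolerance in feet; this is my own suggestion, not part of the workflow above.

    def deviation(a, b, c):
        """Perpendicular distance of point b from the line through a and c."""
        (ax, ay), (bx, by), (cx, cy) = a, b, c
        dx, dy = cx - ax, cy - ay
        length = (dx * dx + dy * dy) ** 0.5
        if length == 0:
            return ((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5
        return abs(dx * (by - ay) - dy * (bx - ax)) / length

    def thin(points, tol=2.0):
        """Keep a track point only where the track bends away from the
        line to the next point by more than tol; endpoints always kept."""
        kept = [points[0]]
        for i in range(1, len(points) - 1):
            if deviation(kept[-1], points[i], points[i + 1]) > tol:
                kept.append(points[i])
        kept.append(points[-1])
        return kept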

Conclusions

     The streets feature classes appear ready for use, and I am eager to try running some network analysis problems with them. Although the street data by itself is ready for the maps I want to make, I still need to develop certain tools and techniques for managing the data. As an example, I have discovered that when a street is split, the original line is destroyed and two new lines are created with their own unique IDs. Any annotation attached to the original line becomes orphaned. I need a way to detect when this happens and reattach the annotation. In addition, I need more complete parcel information: I need to finish verifying the questionable parcel addresses and to gather more business names, suite numbers, and property names. I do not have many yet, except for certain parks, schools, and golf courses that Roseville had embedded in some of its secondary files. I will be working to acquire these in the coming months on an ad hoc basis.
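
     Detecting the orphans should amount to checking each annotation's link field against the surviving line IDs; a minimal Python sketch, with hypothetical IDs and field names, follows.

    # Street lines remaining after edits, keyed by unique ID. A split
    # deletes the original ID and creates two new ones.
    streets = {201: "Main St", 202: "Main St"}

    annotations = [
        {"anno_id": 1, "feature_id": 200, "text": "Main St"},  # orphaned
        {"anno_id": 2, "feature_id": 201, "text": "Main St"},
    ]

    orphans = [a for a in annotations if a["feature_id"] not in streets]
    for a in orphans:
        print("annotation %d (%s) needs reattaching" % (a["anno_id"], a["text"]))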

     The property maps I had originally hoped to import have been scanned, but beyond that I have not had the time to do anything else with them. These I originally acquired simply by going around to various apartment complexes and asking for them. The managers at these complexes were quite gracious in giving me the maps when they had them. They all need cleaning to some extent, some much more than others.

References

  • ESRI. ESRI.com, The GIS Software Leader. 2008. http://www.esri.com/
  • Roseville. City of Roseville, California. 2008. http://www.roseville.ca.us/services/maps_n_data/data_clearinghouse.asp (accessed January 2008)
  • Sacramento. Sacramento County GIS. 2008. http://www.msa.saccounty.net/gis/gis_gisdata.aspx (accessed January 2008)
  • SACOG. Sacramento Area Council Of Governments. 2004. http://www.sacog.org/ (accessed January 2004)
  • PlacerGIS. Placer County Online GIS. November 2008. http://lis.placer.ca.gov/gis.asp?s=1000&h2=628 (accessed November 2008)
  • Microsoft. Visual Studio. 2005. http://msdn.microsoft.com/en-us/vstudio/default.aspx
  • Microsoft. Excel for Microsoft Office. 2007. http://office.microsoft.com/en-us/FX102855291033.aspx
  • Garmin. Garmin. 2008. http://www.garmin.com/garmin/cms/site/us (accessed December 2008)
  • DNR Garmin. Minnesota Department of Natural Resources. 2001. http://www.dnr.state.mn.us/mis/gis/tools/arcview/extensions/DNRGarmin/DNRGarmin.html (accessed December 2008)